Similar resources
A Wrapper Method for Cost-Sensitive Learning via Stratification
Many machine learning applications require classifiers that minimize an asymmetric loss function rather than the raw misclassification rate. We introduce a wrapper method for data stratification to incorporate arbitrary cost matrices into learning algorithms. One way to implement stratification for C4.5 decision tree learners is to manipulate the weights assigned to the examples from different ...
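The snippet above mentions folding a cost matrix into a learner by manipulating the weights of training examples. The following minimal sketch illustrates that general idea with scikit-learn's CART-style DecisionTreeClassifier standing in for C4.5; the cost matrix, the synthetic data, and the rule of weighting each example by the cost of misclassifying its true class are illustrative assumptions, not the paper's wrapper method.

```python
# Illustrative sketch only: cost-sensitive learning via example reweighting,
# with sklearn's CART tree as a stand-in for C4.5. The cost matrix and the
# weighting rule below are assumptions made for this demo.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# cost[i][j] = cost of predicting class j when the true class is i
cost = np.array([[0.0, 1.0],     # false positives are cheap
                 [10.0, 0.0]])   # false negatives are 10x more expensive

X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)

# Weight each example by the total cost of misclassifying its true class.
class_weights = cost.sum(axis=1)          # -> [1.0, 10.0]
sample_weight = class_weights[y]

clf = DecisionTreeClassifier(random_state=0)
clf.fit(X, y, sample_weight=sample_weight)
print(clf.score(X, y))
```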
Homotopy Parametric Simplex Method for Sparse Learning
High-dimensional sparse learning poses a great computational challenge for large-scale data analysis. In this paper, we are interested in a broad class of sparse learning approaches formulated as linear programs parametrized by a regularization factor, and solve them with the parametric simplex method (PSM). Our parametric simplex method offers significant advantages over other competing met...
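To make the "linear programs parametrized by a regularization factor" framing concrete, here is a small sketch of one member of that family, the Dantzig selector, written as a plain LP and solved for a single fixed lambda with SciPy's generic solver. This is not the parametric simplex method itself, whose point is to trace the whole regularization path efficiently; the synthetic data and the lambda value are assumptions for the demo.

```python
# Minimal sketch (not the paper's PSM): the Dantzig selector
#   min ||beta||_1  s.t.  ||X^T (y - X beta)||_inf <= lambda
# rewritten as an LP via beta = beta_plus - beta_minus and solved once.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 50, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:3] = [2.0, -1.5, 1.0]   # sparse ground truth
y = X @ beta_true + 0.1 * rng.standard_normal(n)
lam = 1.0                                                    # fixed regularization factor

G = X.T @ X
A_block = np.hstack([G, -G])            # X^T X (beta_plus - beta_minus)
c = np.ones(2 * p)                      # sum(beta_plus + beta_minus) = ||beta||_1
A_ub = np.vstack([A_block, -A_block])   # two-sided sup-norm constraint
b_ub = np.concatenate([lam + X.T @ y, lam - X.T @ y])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
beta_hat = res.x[:p] - res.x[p:]
print(np.round(beta_hat, 2))            # large entries on the first three coordinates
```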
A New IRIS Segmentation Method Based on Sparse Representation
Iris recognition is one of the most reliable methods for identification. In general, it consists of image acquisition, iris segmentation, feature extraction, and matching. Among these, iris segmentation plays an important role in the performance of any iris recognition system. Nonlinear eye movement, occlusion, and specular reflection are the main challenges for any iris segmentation method. In thi...
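The snippet does not describe the segmentation pipeline itself, but as background on the sparse-representation machinery the title refers to, here is a tiny sketch of sparse coding: approximating a signal by a few atoms of an overcomplete dictionary using orthogonal matching pursuit. The random dictionary and the sparsity level are made-up assumptions, not the paper's method.

```python
# Generic sparse-representation demo (not the paper's iris pipeline):
# find a code with few nonzeros such that D @ code ~= y, via OMP.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n_features, n_atoms = 64, 256
D = rng.standard_normal((n_features, n_atoms))        # overcomplete dictionary
D /= np.linalg.norm(D, axis=0)                        # unit-norm atoms

true_code = np.zeros(n_atoms)
true_code[rng.choice(n_atoms, size=5, replace=False)] = rng.standard_normal(5)
y = D @ true_code                                     # signal built from 5 atoms

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5, fit_intercept=False)
omp.fit(D, y)
print(np.count_nonzero(omp.coef_))                    # a 5-atom sparse code
```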
A Sparse Parameter Learning Method for Probabilistic Logic Programs
We propose a new parameter learning algorithm for ProbLog, an extension of logic programs that can perform probabilistic inference. Our algorithm differs from previous parameter learning algorithms for probabilistic logic program (PLP) models in that it tries to reduce the number of probabilistic parameters contained in the estimated program. Since the amount of computation...
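As a loose, generic illustration of sparsity-encouraging parameter estimation (not ProbLog's learning algorithm and not the paper's method), the toy below fits probabilities of independent probabilistic facts from fully observed true/false counts under an L1 penalty; facts the data never need converge to probability 0 and could be pruned from the program. All counts and hyperparameters are invented for the demo.

```python
# Toy sketch only: L1-penalized maximum likelihood for the probabilities of
# independent "probabilistic facts", by projected gradient descent. Facts that
# are never observed true are driven to exactly 0 and can be dropped.
import numpy as np

successes = np.array([40, 12, 0, 0, 3])     # times each fact was observed true
failures  = np.array([10, 38, 50, 50, 47])  # times each fact was observed false
lam, lr, eps = 2.0, 0.001, 1e-6

theta = np.full(5, 0.5)                     # start every parameter at 0.5
for _ in range(5000):
    # gradient of  -[s*log(theta) + f*log(1-theta)] + lam*theta
    grad = -successes / np.maximum(theta, eps) \
           + failures / np.maximum(1.0 - theta, eps) + lam
    theta = np.clip(theta - lr * grad, 0.0, 1.0)

print(np.round(theta, 3))   # the two facts never observed true end up at 0
```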
Journal
Journal title: SSRN Electronic Journal
Year: 2020
ISSN: 1556-5068
DOI: 10.2139/ssrn.3633843